Lorentz-violating vs ghost gravitons: the example of Weyl gravity
We show that the ghost degrees of freedom of Einstein gravity with a Weyl
term can be eliminated by a simple mechanism that invokes local Lorentz
symmetry breaking. We demonstrate how the mechanism works in a cosmological
setting. The presence of the Weyl term forces a redefinition of the quantum
vacuum state of the tensor perturbations. As a consequence the amplitude of
their spectrum blows up when the Lorentz-violating scale becomes comparable to
the Hubble radius. Such a behaviour is in sharp contrast to what happens in
standard Weyl gravity where the gravitational ghosts smoothly damp out the
spectrum of primordial gravitational waves.
Comment: 14 pages, 3 figures, REVTeX 4
Gravitational Waves Astronomy: a cornerstone for gravitational theories
Realizing gravitational-wave (GW) astronomy in the coming years is a great
challenge for the scientific community. By providing a significant amount of
new information, GWs will be a cornerstone for a better understanding of
gravitational physics. In this paper we discuss how GW astronomy will make it
possible to settle a captivating issue of gravitation: it will be the
definitive test of Einstein's general relativity (GR) or, alternatively, a
strong endorsement of extended theories of gravity (ETG).
Comment: To appear in Proceedings of the Workshop "Cosmology, the Quantum
Vacuum and Zeta Functions" for the celebration of Emilio Elizalde's sixtieth
birthday, Barcelona, March 8-10, 201
Gravitational Wave Astronomy: in Anticipation of First Sources to be Detected
The first generation of long-baseline laser interferometric detectors of
gravitational waves will start collecting data in 2001-2003. We carefully
analyse their planned performance and compare it with the expected strengths of
astrophysical sources. The scientific importance of the anticipated discovery
of various gravitational wave signals and the reliability of theoretical
predictions are taken into account in our analysis. We try to be conservative
both in evaluating the theoretical uncertainties about a source and the
prospects of its detection. After having considered many possible sources, we
place our emphasis on (1) inspiraling binaries consisting of stellar mass black
holes and (2) relic gravitational waves. We draw the conclusion that
inspiraling binary black holes are likely to be detected first by the initial
ground-based interferometers. We estimate that the initial interferometers will
see 2-3 events per year from black hole binaries with component masses
10-15M_\odot, with a signal-to-noise ratio of around 2-3, in each of a network
of detectors consisting of GEO, VIRGO and the two LIGOs. It appears that other
possible sources, including coalescing neutron stars, are unlikely to be
detected by the initial instruments. We also argue that relic gravitational
waves may be discovered by the space-based interferometers in the frequency
interval 2x10^{-3}-10^{-2} Hz, at a signal-to-noise ratio of around 3.
Comment: latex, 100 pages, including 20 postscript figures. Small typos
corrected, references added
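The per-detector signal-to-noise ratios quoted above combine coherently across a network: for matched filtering with independent detector noise, the network SNR is the quadrature sum of the individual SNRs. A minimal sketch (the numerical values are illustrative, not taken from the paper's detailed estimates):

```python
import math

def network_snr(per_detector_snrs):
    """Coherent network SNR for matched filtering with independent
    detector noise: individual SNRs add in quadrature."""
    return math.sqrt(sum(rho ** 2 for rho in per_detector_snrs))

# Illustrative values only: SNR ~ 2-3 in each of four detectors
# (GEO, VIRGO and the two LIGOs).
print(network_snr([2.5, 2.5, 2.5, 2.5]))  # -> 5.0
```

The quadrature combination is why a marginal single-detector signal can still be a confident network detection.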
Dynamical Mean-Field Theory within an Augmented Plane-Wave Framework: Assessing Electronic Correlations in the Iron Pnictide LaFeAsO
We present an approach that combines the local density approximation (LDA)
and the dynamical mean-field theory (DMFT) in the framework of the
full-potential linear augmented plane waves (FLAPW) method. Wannier-like
functions for the correlated shell are constructed by projecting local orbitals
onto a set of Bloch eigenstates located within a certain energy window. The
screened Coulomb interaction and Hund's coupling are calculated from a
first-principles constrained RPA scheme. We apply this LDA+DMFT implementation,
in conjunction with continuous-time quantum Monte Carlo, to study the
electronic correlations in LaFeAsO. Our findings support the physical picture
of a metal with intermediate correlations. The average value of the mass
renormalization of the Fe 3d bands is about 1.6, in reasonable agreement with
the picture inferred from photoemission experiments. The discrepancies between
different LDA+DMFT calculations (all technically correct) which have been
reported in the literature are shown to have two causes: i) the specific value
of the interaction parameters used in these calculations and ii) the degree of
localization of the Wannier orbitals chosen to represent the Fe 3d states, to
which many-body terms are applied. The latter is a fundamental issue in the
application of many-body calculations, such as DMFT, in a realistic setting. We
provide strong evidence that the DMFT approximation is more accurate and more
straightforward to implement when well-localized orbitals are constructed from
a large energy window encompassing Fe-3d, As-4p and O-2p, and point out several
difficulties associated with the use of extended Wannier functions associated
with the low-energy iron bands. Some of these issues have important physical
consequences, regarding in particular the sensitivity to Hund's coupling.
Comment: 16 pages, 9 figures, published version
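The construction described above (projecting trial local orbitals onto the Bloch eigenstates inside a chosen energy window, then orthonormalizing) can be sketched with dense linear algebra. This toy single-k-point illustration uses random matrices in place of the actual FLAPW eigenstates and local orbitals; all sizes are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
nbasis, nbands, norb = 20, 8, 5  # hypothetical basis / window / orbital sizes

# Columns: orthonormal Bloch eigenstates inside the chosen energy window.
psi = np.linalg.qr(rng.standard_normal((nbasis, nbands)))[0]
# Columns: trial local orbitals (atomic-d-like), not yet orthonormal.
chi = rng.standard_normal((nbasis, norb))

# Overlap of the local orbitals with the window states ...
overlap = psi.conj().T @ chi
# ... Loewdin-orthonormalized via an SVD, giving Wannier-like functions
# that live entirely inside the energy window.
u, _, vh = np.linalg.svd(overlap, full_matrices=False)
wann = psi @ (u @ vh)

print(np.allclose(wann.conj().T @ wann, np.eye(norb)))  # -> True
```

Enlarging the window (more columns in `psi`) makes the projected orbitals more localized in real space, which is the trade-off the abstract discusses.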
Probing Quantum Geometry at LHC
We present evidence that the volumes of compactified spaces, as well as the
areas of black-hole horizons, must be quantized in Planck units. This
quantization has phenomenological consequences, the most dramatic being for
micro black holes in theories with TeV-scale gravity, which can be produced at
the LHC. We predict that black holes come in the form of a discrete tower with
well-defined spacing. Instead of evaporating thermally, they decay through a
sequence of spontaneous particle emissions, with each transition reducing the
horizon area by a strictly integer number of Planck units. Quantization of the
horizons can be the crucial missing link by which the notion of a minimal
length in gravity eliminates physical singularities. If the remnants of the
black holes with the minimal possible area, and mass of order a few TeV, are
stable, they might be good candidates for the cold dark matter in the Universe.
Comment: 14 pages, LaTeX
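The discrete tower can be sketched under the assumption that the horizon area comes in integer Planck units and that the mass scales Schwarzschild-like as the square root of the area; the minimal mass value below is hypothetical, not the paper's:

```python
import math

M_MIN_TEV = 1.0  # hypothetical minimal black-hole mass, in TeV

def mass_of_level(n, m_min=M_MIN_TEV):
    """Mass of the level whose horizon area is n Planck units.

    Assumes A_n = n * (Planck area) and a Schwarzschild-like
    M proportional to sqrt(A), so M_n = M_min * sqrt(n)."""
    return m_min * math.sqrt(n)

def emission_energy(n):
    """Energy released when the horizon shrinks by one Planck unit."""
    return mass_of_level(n) - mass_of_level(n - 1)

# The spacing of the tower shrinks as the level number grows:
for n in (1, 2, 10, 100):
    print(n, round(mass_of_level(n), 3), round(emission_energy(n), 3))
```

Under this scaling the level spacing decreases with n, so the decay chain of a heavy micro black hole would produce a sequence of emissions of increasing energy as it descends toward the minimal-area remnant.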
Slepian functions and their use in signal estimation and spectral analysis
It is a well-known fact that mathematical functions that are timelimited (or
spacelimited) cannot be simultaneously bandlimited (in frequency). Yet the
finite precision of measurement and computation unavoidably bandlimits our
observation and modeling of scientific data, and we often only have access to, or
are only interested in, a study area that is temporally or spatially bounded.
In the geosciences we may be interested in spectrally modeling a time series
defined only on a certain interval, or we may want to characterize a specific
geographical area observed using an effectively bandlimited measurement device.
It is clear that analyzing and representing scientific data of this kind will
be facilitated if a basis of functions can be found that are "spatiospectrally"
concentrated, i.e. "localized" in both domains at the same time. Here, we give
a theoretical overview of one particular approach to this "concentration"
problem, as originally proposed for time series by Slepian and coworkers, in
the 1960s. We show how this framework leads to practical algorithms and
statistically performant methods for the analysis of signals and their power
spectra in one and two dimensions, and on the surface of a sphere.
Comment: Submitted to the Handbook of Geomathematics, edited by Willi Freeden,
Zuhair M. Nashed and Thomas Sonar, and to be published by Springer Verlag
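In discrete time, the solution to the concentration problem described above is the family of discrete prolate spheroidal (Slepian) sequences, available in SciPy. A brief sketch of the hallmark behaviour: roughly the first 2NW tapers are almost perfectly concentrated in the band, after which the concentration ratios fall off sharply (parameters illustrative):

```python
import numpy as np
from scipy.signal.windows import dpss

# Discrete prolate spheroidal (Slepian) sequences: length-N sequences
# maximally concentrated in the band |f| <= W, with time-bandwidth
# product N*W = NW.
N, NW, K = 512, 4, 7  # illustrative parameters
tapers, ratios = dpss(N, NW, K, return_ratios=True)

# Concentration ratios: essentially 1 for the first ~2*NW tapers,
# then falling off sharply -- the hallmark of the Slepian problem.
print(np.round(ratios, 6))
```

Averaging periodograms computed with the well-concentrated tapers is the multitaper spectral estimator built on this basis.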
Quantum-squeezing effects of strained multilayer graphene NEMS
Quantum squeezing can improve the ultimate measurement precision by squeezing one of the two fluctuating physical quantities in a Heisenberg relation. We propose a scheme to obtain squeezed states through a graphene nanoelectromechanical system (NEMS), taking advantage, in principle, of its small thickness. Two key criteria for achieving squeezed states, the zero-point displacement uncertainty and the squeezing factor of strained multilayer graphene NEMS, are studied. Our research pushes the precision limit of graphene-based nano-transducers by reducing quantum noise through squeezed states.
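The zero-point displacement uncertainty mentioned above follows from treating a flexural mode as a quantum harmonic oscillator, x_zp = sqrt(hbar / (2 m omega)); a lighter (thinner) resonator has a larger x_zp, which is why thin graphene helps. A minimal sketch with illustrative device parameters, not values from the paper:

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s

def zero_point_displacement(mass_kg, freq_hz):
    """x_zp = sqrt(hbar / (2 m omega)) for a mechanical mode
    treated as a quantum harmonic oscillator."""
    omega = 2.0 * math.pi * freq_hz
    return math.sqrt(HBAR / (2.0 * mass_kg * omega))

# Illustrative device parameters (hypothetical): a femtogram-scale
# multilayer-graphene resonator with a 100 MHz flexural mode.
x_zp = zero_point_displacement(1e-18, 1e8)
print(f"x_zp = {x_zp:.2e} m")
```

A squeezed state would reduce the fluctuation of one quadrature below this x_zp at the expense of the conjugate one.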
Stochastic backgrounds of relic gravitons: a theoretical appraisal
Stochastic backgrounds of relic gravitons, if ever detected, will constitute
a prima facie evidence of physical processes taking place during the earliest
stages of the evolution of the plasma. The essentials of the stochastic
backgrounds of relic gravitons are hereby introduced and reviewed. The pivotal
observables customarily employed to infer the properties of the relic gravitons
are discussed both in the framework of the ΛCDM paradigm and in
neighboring contexts. The complementarity between experiments measuring the
polarization of the Cosmic Microwave Background (such as, for instance, WMAP,
Capmap, Quad, Cbi, just to mention a few) and wide band interferometers (e.g.
Virgo, Ligo, Geo, Tama) is emphasized. While the analysis of the microwave sky
strongly constrains the low-frequency tail of the relic graviton spectrum,
wide-band detectors are sensitive to much higher frequencies where the spectral
energy density depends chiefly upon the (poorly known) rate of
post-inflationary expansion.
Comment: 94 pages, 32 figures
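The spectral energy density mentioned above relates to the characteristic strain that a wide-band interferometer measures through the standard conversion Omega_GW(f) = (2 pi^2 / 3 H_0^2) f^2 h_c(f)^2. A minimal sketch, with an assumed fiducial H_0:

```python
import math

H0 = 2.2e-18  # fiducial Hubble rate in 1/s (h ~ 0.7); an assumed value

def characteristic_strain(omega_gw, f_hz, h0=H0):
    """Invert Omega_GW(f) = (2 pi^2 / 3 H_0^2) f^2 h_c(f)^2 for h_c."""
    return math.sqrt(1.5 * omega_gw) * h0 / (math.pi * f_hz)

# Illustrative: the same flat Omega_GW = 1e-10 corresponds to vastly
# different strains at space-based vs ground-based frequencies.
for f in (1e-2, 1e2):
    print(f"f = {f:g} Hz  h_c = {characteristic_strain(1e-10, f):.1e}")
```

The 1/f scaling at fixed Omega_GW is one reason low-frequency detectors probe a flat relic spectrum at much larger strain amplitudes.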
Scalar and vector Slepian functions, spherical signal estimation and spectral analysis
It is a well-known fact that mathematical functions that are timelimited (or
spacelimited) cannot be simultaneously bandlimited (in frequency). Yet the
finite precision of measurement and computation unavoidably bandlimits our
observation and modeling of scientific data, and we often only have access to, or
are only interested in, a study area that is temporally or spatially bounded.
In the geosciences we may be interested in spectrally modeling a time series
defined only on a certain interval, or we may want to characterize a specific
geographical area observed using an effectively bandlimited measurement device.
It is clear that analyzing and representing scientific data of this kind will
be facilitated if a basis of functions can be found that are "spatiospectrally"
concentrated, i.e. "localized" in both domains at the same time. Here, we give
a theoretical overview of one particular approach to this "concentration"
problem, as originally proposed for time series by Slepian and coworkers, in
the 1960s. We show how this framework leads to practical algorithms and
statistically performant methods for the analysis of signals and their power
spectra in one and two dimensions, and, particularly for applications in the
geosciences, for scalar and vectorial signals defined on the surface of a unit
sphere.
Comment: Submitted to the 2nd Edition of the Handbook of Geomathematics,
edited by Willi Freeden, Zuhair M. Nashed and Thomas Sonar, and to be
published by Springer Verlag. This is a slightly modified but expanded
version of the paper arxiv:0909.5368 that appeared in the 1st Edition of the
Handbook, when it was called: Slepian functions and their use in signal
estimation and spectral analysis
The collapse of the wave function in the joint metric-matter quantization for inflation
It has been argued that the standard inflationary scenario suffers from a
serious deficiency as a model for the origin of the seeds of cosmic structure:
it cannot truly account for the transition from an early homogeneous and
isotropic stage to a later one lacking such symmetries. The issue has often
been thought of as a standard instance of the "quantum measurement problem",
but as some of us have recently argued, the situation reaches a critical
level in the cosmological context of interest here. This has led to a proposal
in which the standard paradigm is supplemented by a hypothesis concerning the
self-induced dynamical collapse of the wave function, as representing the
physical mechanism through which such change of symmetry is brought forth. This
proposal was formulated within the context of semiclassical gravity. Here we
investigate an alternative realization of this idea, implemented directly
within the standard analysis in terms of a quantum field jointly describing
the inflaton and metric perturbations, the so-called Mukhanov-Sasaki variable. We
show that even though the prescription is quite different, the theoretical
predictions include some deviations from the standard ones, which are indeed
very similar to those found in the early studies. We briefly discuss the
differences between the two at both the conceptual and the practical level.
Comment: 31 pages, 6 figures. Replaced to match the published version